Joint quantum entropy
The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states \rho and \sigma, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written S(\rho,\sigma) or H(\rho,\sigma), depending on the notation being used for the von Neumann entropy. Like other entropies, the joint quantum entropy is measured in bits, i.e. the logarithm is taken in base 2.
In this article, we will use S(\rho,\sigma) for the joint quantum entropy.
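As a concrete illustration (not part of the original article), the following minimal NumPy sketch computes the joint quantum entropy of a two-qubit Bell state, using the standard definition S(\rho^{AB}) = -\operatorname{Tr}(\rho^{AB} \log_2 \rho^{AB}) together with the von Neumann entropy discussed in the Background section below; the helper name entropy_bits is ours.

<syntaxhighlight lang="python">
import numpy as np

def entropy_bits(rho, tol=1e-12):
    """Von Neumann entropy -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > tol]             # 0 * log 0 is taken to be 0
    return float(-np.sum(eigvals * np.log2(eigvals)))

# Two-qubit Bell state |Phi+> = (|00> + |11>)/sqrt(2).
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_ab = np.outer(phi, phi)                      # joint density matrix of the pure Bell state

# The joint state is pure, so the joint entropy is zero ...
print(entropy_bits(rho_ab))                      # ~0.0 bits

# ... yet each subsystem on its own is maximally mixed, with entropy 1 bit.
rho_a = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=1, axis2=3)   # partial trace over B
print(entropy_bits(rho_a))                       # 1.0 bit
</syntaxhighlight>

This situation, where the joint entropy is smaller than the entropy of a subsystem, has no classical counterpart: the classical joint entropy is never smaller than the entropy of either marginal.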
==Background==

In information theory, for any classical random variable X, the classical Shannon entropy H(X) is a measure of how uncertain we are about the outcome of X. For example, if the distribution of X is concentrated at a single point, the outcome of X is certain and therefore its entropy is H(X)=0. At the other extreme, if X has the uniform probability distribution over n possible values, one would intuitively expect X to carry the most uncertainty. Indeed, such uniform probability distributions have the maximum possible entropy H(X) = \log_2(n).
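As a quick check of these two extremes (our example, not from the original text), a short NumPy sketch:

<syntaxhighlight lang="python">
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_i p_i log2 p_i for a probability vector p, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                 # 0 * log 0 is taken to be 0
    return float(-np.sum(p * np.log2(p)))

print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0  : outcome is certain
print(shannon_entropy([0.25] * 4))                # 2.0  : uniform over n=4 values, log2(4)
print(shannon_entropy([0.5, 0.25, 0.125, 0.125])) # 1.75 : between the two extremes
</syntaxhighlight>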
In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, or density matrices. For a state \rho, the von Neumann entropy is defined by
:S(\rho) = - \operatorname{Tr}(\rho \log \rho).
Applying the spectral theorem, or the Borel functional calculus for infinite-dimensional systems, we see that it generalizes the classical entropy. The physical meaning remains the same. A maximally mixed state, the quantum analog of the uniform probability distribution, has maximum von Neumann entropy. On the other hand, a pure state, or a rank-one projection, has zero von Neumann entropy. We write the von Neumann entropy as S(\rho) (or sometimes H(\rho)).
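A minimal sketch of this computation, assuming a finite-dimensional system and NumPy (the helper name von_neumann_entropy is ours), showing the two extreme cases just described:

<syntaxhighlight lang="python">
import numpy as np

def von_neumann_entropy(rho, tol=1e-12):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho (in bits)."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > tol]             # 0 * log 0 is taken to be 0
    return float(-np.sum(eigvals * np.log2(eigvals)))

d = 4
maximally_mixed = np.eye(d) / d                  # quantum analog of the uniform distribution
pure_state = np.zeros((d, d))
pure_state[0, 0] = 1.0                           # rank-one projection

print(von_neumann_entropy(maximally_mixed))      # log2(4) = 2.0 bits (maximum)
print(von_neumann_entropy(pure_state))           # 0.0 bits
</syntaxhighlight>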

Source: Wikipedia, the free encyclopedia. Read the full "Joint quantum entropy" article on Wikipedia.